# Multi-task Transfer Learning
## OceanSAR-1
- **Author:** galeio-research
- **License:** Apache-2.0
- **Task:** Image Classification
- **Framework:** Transformers
- **Downloads:** 117 · **Likes:** 1

OceanSAR-1 is a vision foundation model designed for Synthetic Aperture Radar (SAR) image analysis, particularly suited to ocean observation tasks.
## kf-deberta-base-cross-nli
- **Author:** deliciouscat
- **License:** MIT
- **Task:** Text Classification
- **Framework:** Transformers · **Language:** Korean
- **Downloads:** 21 · **Likes:** 2

A Korean natural language inference model based on the DeBERTa architecture, trained on the kor-nli and klue-nli datasets and supporting zero-shot classification tasks.
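A minimal sketch of zero-shot classification with the `transformers` pipeline, as the entry above describes. The model ID `deliciouscat/kf-deberta-base-cross-nli` is inferred from the listing, and the input text and candidate labels are illustrative; running this requires `transformers` and `torch` installed and will download the model weights.

```python
from transformers import pipeline

# Zero-shot classification: the NLI model scores each candidate label
# as an entailment hypothesis against the input text.
classifier = pipeline(
    "zero-shot-classification",
    model="deliciouscat/kf-deberta-base-cross-nli",  # model ID inferred from the listing
)

result = classifier(
    "오늘 서울의 날씨가 매우 좋습니다.",  # "The weather in Seoul is very nice today."
    candidate_labels=["날씨", "경제", "스포츠"],  # weather, economy, sports
)
print(result["labels"][0])  # highest-scoring label
```

Because the labels are scored via entailment, the label set can be changed freely at inference time with no retraining.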
## T5 v1.1 XXL
- **Author:** google
- **License:** Apache-2.0
- **Task:** Large Language Model
- **Framework:** Transformers · **Language:** English
- **Downloads:** 597.64k · **Likes:** 116

T5 1.1 is Google's improved text-to-text Transformer model, using the GEGLU activation function and a purely unsupervised pretraining strategy.
## T5 Base Swedish
- **Author:** birgermoell
- **License:** Apache-2.0
- **Task:** Large Language Model · Other
- **Downloads:** 16 · **Likes:** 0

A Swedish text generation and translation model based on the T5 architecture, suitable for summarization and translation tasks.
## T5 v1.1 Base
- **Author:** google
- **License:** Apache-2.0
- **Task:** Large Language Model · **Language:** English
- **Downloads:** 150.73k · **Likes:** 58

T5 1.1 is Google's improved text-to-text Transformer model with the GEGLU activation function and an optimized architecture, focused on unsupervised pretraining.
## T5-11B
- **Author:** google-t5
- **License:** Apache-2.0
- **Task:** Large Language Model
- **Framework:** Transformers · **Languages:** Multilingual
- **Downloads:** 147.63k · **Likes:** 62

T5-11B is a text-to-text transfer Transformer model developed by Google, with 11 billion parameters, supporting a wide range of NLP tasks.